
    Bulk flow scaling for turbulent channel and pipe flows

    We report a theory deriving bulk flow scaling for canonical wall-bounded flows. The theory accounts for the symmetries of the boundary geometry (flat-plate channel versus circular pipe) through a variational calculation for a large-scale energy length, which characterizes the bulk flow scaling by a simple exponent: m = 4 for the channel and m = 5 for the pipe. The predicted mean velocity shows excellent agreement with several dozen sets of quality empirical data over a wide range of Reynolds numbers (Re), with a universal bulk flow constant κ ≈ 0.45. Predictions for dissipation and turbulent transport in the bulk flow are also given, awaiting data verification. Comment: 4 pages, 4 figures
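    For context only (the abstract does not state the paper's predicted mean-velocity formula): a bulk flow constant κ ≈ 0.45 is of the magnitude conventionally associated with the von Kármán constant in the classical logarithmic mean-velocity profile, which in standard inner units reads

        U^+ = \frac{1}{\kappa}\,\ln y^+ + B, \qquad \kappa \approx 0.45, \quad B \approx 5,

    where U^+ and y^+ are the mean velocity and wall distance in viscous units and B is an empirical additive constant; the paper's own bulk-flow prediction may differ from this textbook form.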

    A dynamic Bayesian network approach to protein secondary structure prediction

    Background: Protein secondary structure prediction methods based on probabilistic models such as hidden Markov models (HMM) appeal to many because they provide meaningful information about the sequence-structure relationship. At present, however, the prediction accuracy of pure HMM-type methods is much lower than that of machine learning-based methods such as neural networks (NN) or support vector machines (SVM).
    Results: In this paper, we report a new probabilistic method for protein secondary structure prediction based on dynamic Bayesian networks (DBN). The new method models the PSI-BLAST profile of a protein sequence with a multivariate Gaussian distribution, and simultaneously takes into account the dependency between the profile and the secondary structure and the dependency between profiles of neighboring residues. In addition, a segment length distribution is introduced for each secondary structure state. Tests show that the DBN method significantly improves accuracy compared with other pure HMM-type methods. Further improvement is achieved by combining the DBN with an NN, a method called DBNN, which shows better Q3 accuracy than many popular methods and is competitive with the current state of the art. The most interesting feature of DBN/DBNN is that a significant improvement in prediction accuracy is achieved when it is combined with other methods by a simple consensus.
    Conclusion: The DBN method, using a Gaussian distribution for the PSI-BLAST profile and a high-order dependency between profiles of neighboring residues, produces significantly better prediction accuracy than other HMM-type probabilistic methods. Owing to their different natures, the DBN and NN combine to form a more accurate method, DBNN. Further improvement may be achieved by combining DBNN with an SVM-type method.
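    Since the abstract gives no model equations, the following is only a minimal sketch of the kind of probabilistic decoding it describes: a plain first-order HMM with multivariate Gaussian emissions over PSI-BLAST profile vectors, decoded with Viterbi. The state set, profile dimension, and all parameter values are illustrative assumptions; the paper's DBN additionally models inter-profile dependencies and segment length distributions, which are omitted here.

# A minimal sketch, not the paper's DBN: a first-order HMM with multivariate
# Gaussian emissions over profile vectors, decoded with the Viterbi algorithm.
# All parameters and shapes below are illustrative assumptions.
import numpy as np

STATES = ["H", "E", "C"]   # helix, strand, coil
D = 20                     # profile dimension (one score per amino acid)

def log_gaussian(x, mean, cov):
    """Log density of a multivariate Gaussian emission."""
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    quad = diff @ np.linalg.solve(cov, diff)
    return -0.5 * (D * np.log(2 * np.pi) + logdet + quad)

def viterbi(profiles, log_start, log_trans, means, covs):
    """Most likely state path for a sequence of profile vectors."""
    T, S = len(profiles), len(STATES)
    logp = np.full((T, S), -np.inf)
    back = np.zeros((T, S), dtype=int)
    logp[0] = log_start + np.array(
        [log_gaussian(profiles[0], means[s], covs[s]) for s in range(S)])
    for t in range(1, T):
        for s in range(S):
            scores = logp[t - 1] + log_trans[:, s]
            back[t, s] = int(np.argmax(scores))
            logp[t, s] = scores[back[t, s]] + log_gaussian(profiles[t], means[s], covs[s])
    path = [int(np.argmax(logp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [STATES[s] for s in reversed(path)]

# Toy usage with random placeholder parameters.
rng = np.random.default_rng(0)
means = rng.normal(size=(3, D))
covs = np.stack([np.eye(D)] * 3)
log_start = np.log(np.full(3, 1 / 3))
log_trans = np.log(np.full((3, 3), 1 / 3))
profiles = rng.normal(size=(8, D))
print("".join(viterbi(profiles, log_start, log_trans, means, covs)))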

    A Spike-Timing Pattern Based Neural Network Model for the Study of Memory Dynamics

    It is well accepted that the brain's computation relies on the spatiotemporal activity of neural networks. In particular, there is growing evidence of the importance of continuously and precisely timed spiking activity. It is therefore important to characterize memory states in terms of spike-timing patterns that give both reliable memory of firing activities and precise memory of firing timings. The relationship between memory states and spike-timing patterns has been studied empirically in recent years with large-scale recordings of neuronal populations. Here, using a recurrent neural network model with dynamics at two time scales, we construct a dynamical memory network model that embeds both fast neural and synaptic variation and slow learning dynamics. A state vector is proposed to describe memory states in terms of the spike-timing patterns of the neural population, and a distance measure on state vectors is defined to study several important phenomena of memory dynamics: partial memory recall, learning efficiency, and learning with correlated stimuli. We show that the distance measure can capture the timing differences of memory states. In addition, we examine the influence of network topology on learning ability and show that local connections can increase the network's ability to embed more memory states. Together, these results suggest that the proposed system based on spike-timing patterns gives a productive model for the study of detailed learning and memory dynamics.
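    The abstract does not define the state vector or the distance measure, so the snippet below is only a sketch under stated assumptions: it encodes a memory state as the vector of each neuron's most recent spike time and compares two states with a Euclidean distance. This captures timing differences in the spirit described above but is not the paper's definition.

# A hedged sketch, not the paper's definition: a memory "state vector" built
# from each neuron's most recent spike time, compared by Euclidean distance.
import numpy as np

def state_vector(spike_times, n_neurons, t_now, default=0.0):
    """Latest spike time per neuron (illustrative choice of state encoding).

    spike_times: list of (neuron_index, time) pairs with time <= t_now.
    """
    v = np.full(n_neurons, default)
    for i, t in spike_times:
        if t <= t_now and t > v[i]:
            v[i] = t
    return v

def state_distance(v_a, v_b):
    """Euclidean distance between two state vectors (one possible measure)."""
    return float(np.linalg.norm(v_a - v_b))

# Toy usage: two firing patterns that differ only in the timing of neuron 2.
a = state_vector([(0, 1.0), (1, 2.5), (2, 3.0)], n_neurons=4, t_now=5.0)
b = state_vector([(0, 1.0), (1, 2.5), (2, 3.8)], n_neurons=4, t_now=5.0)
print(state_distance(a, b))  # prints ~0.8: a small timing shift gives a small distance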